
    Differential Privacy for the Analyst via Private Equilibrium Computation

    We give new mechanisms for answering exponentially many queries from multiple analysts on a private database, while protecting differential privacy both for the individuals in the database and for the analysts. That is, our mechanism's answer to each query is nearly insensitive to changes in the queries asked by other analysts. Our mechanism is the first to offer differential privacy on the joint distribution over analysts' answers, providing privacy for data analysts even if the other data analysts collude or register multiple accounts. In some settings, we are able to achieve nearly optimal error rates (even compared to mechanisms which do not offer analyst privacy), and we are able to extend our techniques to handle non-linear queries. Our analysis is based on a novel view of the private query-release problem as a two-player zero-sum game, which may be of independent interest.
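    For context, the individual-level guarantee referred to above is the standard (epsilon, delta)-differential privacy condition, sketched below in standard notation; this is background, not notation taken from the paper. The analyst-level guarantee described in the abstract is the analogous condition in which the neighboring objects are two query streams differing only in one analyst's queries, rather than two databases differing in one record.

        % Standard (\varepsilon, \delta)-differential privacy for individuals:
        % for all neighboring databases D, D' differing in one record,
        % and for all sets S of outputs,
        \Pr[M(D) \in S] \le e^{\varepsilon} \Pr[M(D') \in S] + \delta
        % Analyst privacy (as informally described in the abstract): the same
        % form of bound, with D, D' replaced by query streams Q, Q' that differ
        % only in one analyst's queries, and with the joint distribution over
        % the other analysts' answers in place of M(D).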

    Probabilistic Couplings For Probabilistic Reasoning

    This thesis explores proofs by coupling from the perspective of formal verification. Long employed in probability theory and theoretical computer science, these proofs construct couplings between the output distributions of two probabilistic processes. Couplings can imply various probabilistic relational properties, guarantees that compare two runs of a probabilistic computation. To give a formal account of this clean proof technique, we first show that proofs in the program logic pRHL (probabilistic Relational Hoare Logic) describe couplings. We formalize couplings that establish various probabilistic properties, including distribution equivalence, convergence, and stochastic domination. Then we deepen the connection between couplings and pRHL by giving a proofs-as-programs interpretation: a coupling proof encodes a probabilistic product program, whose properties imply relational properties of the original two programs. We design the logic xpRHL (product pRHL) to build the product program, with extensions to model more advanced constructions including shift coupling and path coupling. We then develop an approximate version of probabilistic coupling, based on approximate liftings. It is known that the existence of an approximate lifting implies differential privacy, a relational notion of statistical privacy. We propose a corresponding proof technique, proof by approximate coupling, inspired by the logic apRHL, a version of pRHL for building approximate liftings. Drawing on ideas from existing privacy proofs, we extend apRHL with novel proof rules for constructing new approximate couplings. We give approximate coupling proofs of privacy for the Report-noisy-max and Sparse Vector mechanisms, well-known algorithms from the privacy literature with notoriously subtle privacy proofs, and produce the first formalized proof of privacy for these algorithms in apRHL. Finally, we enrich the theory of approximate couplings with several more sophisticated constructions: a principle for showing accuracy-dependent privacy, a generalization of the advanced composition theorem from differential privacy, and an optimal approximate coupling relating two subsets of samples. We also show equivalences between approximate couplings and other existing definitions. These ingredients support the first formalized proof of privacy for the Between Thresholds mechanism, an extension of the Sparse Vector mechanism.
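    Because the abstract singles out the Sparse Vector mechanism as a target of the formalized privacy proofs, a minimal Python sketch of the classic Above Threshold variant of Sparse Vector is given below for orientation; the function name and parameters are illustrative and are not taken from the thesis or from apRHL.

        import numpy as np

        def above_threshold(queries, database, threshold, epsilon):
            """Sparse Vector / Above Threshold (standard textbook version).

            Answers a stream of sensitivity-1 queries, reporting only the index
            of the first query whose noisy value exceeds a noisy threshold.
            Releasing just that single index keeps the whole stream epsilon-DP.
            """
            # Perturb the threshold once with Laplace noise of scale 2/epsilon...
            noisy_threshold = threshold + np.random.laplace(scale=2.0 / epsilon)
            for i, q in enumerate(queries):
                # ...and perturb each answer with fresh noise of scale 4/epsilon.
                noisy_answer = q(database) + np.random.laplace(scale=4.0 / epsilon)
                if noisy_answer >= noisy_threshold:
                    return i  # halt at the first noisy above-threshold query
            return None  # no query exceeded the noisy threshold

    The subtlety alluded to above is that the per-query noise does not cause the privacy cost to accumulate over the length of the stream, which is exactly the kind of argument that approximate couplings are used to make precise.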

    Hypothesis Testing Interpretations and Renyi Differential Privacy

    Differential privacy is a de facto standard in data privacy, with applications in the public and private sectors. A way to explain differential privacy that is particularly appealing to statisticians and social scientists is by means of its statistical hypothesis testing interpretation. Informally, one cannot effectively test whether a specific individual has contributed her data by observing the output of a private mechanism: no test can have both high significance and high power. In this paper, we identify some conditions under which a privacy definition given in terms of a statistical divergence satisfies a similar interpretation. These conditions are useful for analyzing the distinguishability power of divergences, and we use them to study the hypothesis testing interpretation of some relaxations of differential privacy based on Renyi divergence. This analysis also results in an improved conversion rule between these definitions and differential privacy.
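    For reference, the Renyi-divergence-based relaxation mentioned above is usually stated as follows; the conversion shown is the one commonly cited in the Renyi differential privacy literature, not the improved rule derived in this paper.

        % Renyi divergence of order \alpha > 1 between distributions P and Q:
        D_{\alpha}(P \,\|\, Q) = \frac{1}{\alpha - 1}
            \log \mathbb{E}_{x \sim Q}\left[ \left( \frac{P(x)}{Q(x)} \right)^{\alpha} \right]

        % (\alpha, \varepsilon)-Renyi differential privacy: for all neighboring D, D',
        D_{\alpha}\left( M(D) \,\|\, M(D') \right) \le \varepsilon

        % Commonly used conversion to (\varepsilon', \delta)-differential privacy,
        % valid for any \delta \in (0, 1):
        % \varepsilon' = \varepsilon + \frac{\log(1/\delta)}{\alpha - 1}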